Better LLM Explanations CF-597 #125
Conversation
PR Reviewer Guide 🔍 (review updated until commit 08aed91)

Here are some key observations to aid the review process:

PR Code Suggestions ✨

Explore these optional code suggestions:

Persistent review updated to latest commit 08aed91
incorporated all the feedback
User description
wip
PR Type
Enhancement
Description

- Add `get_new_explanation` method in `AIService`
- Integrate the explanation API call into the function optimizer
- Construct fallback logic for the explanation
- Handle request errors and log them appropriately
Diagram Walkthrough
File Walkthrough
aiservice.py (codeflash/api/aiservice.py)

- Implement the explanation endpoint in `AIService` via a new `get_new_explanation` method

function_optimizer.py (codeflash/optimization/function_optimizer.py)

- Integrate AI explanations into the optimizer
- Call `get_new_explanation` in the optimizer workflow
- Use `dataclasses.replace` on `Explanation` to set `new_explanation`, falling back to the raw message
- Related changes around `check_create_pr`